Multi-Objective Crowd Worker Selection in Crowdsourced Testing

Authors

  • Qiang Cui
  • Song Wang
  • Junjie Wang
  • Yuanzhe Hu
  • Qing Wang
  • Mingshu Li
Abstract

Crowdsourced testing is an emerging trend in software testing, which relies on crowd workers to accomplish test tasks. Typically, a crowdsourced testing task aims to detect as many bugs as possible within a limited budget. For a specific test task, not all crowd workers are qualified to perform it, and different test tasks require crowd workers with different experience, domain knowledge, etc. Inappropriate workers may miss true bugs, introduce false bugs, or report duplicate bugs, which not only decreases the quality of test outcomes but also increases the cost of hiring workers. Thus, how to select the appropriate crowd workers for specific test tasks is a challenge in crowdsourced testing. This paper proposes a Multi-Objective crowd wOrker SElection approach (MOOSE), which includes three objectives: maximizing the coverage of test requirements, minimizing the cost, and maximizing the bug-detection experience of the selected crowd workers. Specifically, MOOSE leverages NSGA-II, a widely used multi-objective evolutionary algorithm, to optimize the three objectives when selecting workers. We evaluate MOOSE on 42 test tasks (involving 844 crowd workers and 3,984 test reports) from one of the largest crowdsourced testing platforms in China, and the experimental results show that MOOSE improves the best baseline by 17% on average in bug detection rate.
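The core idea of the abstract can be illustrated with a minimal sketch of three-objective worker selection. The worker data below is entirely hypothetical, and for clarity the sketch exhaustively enumerates small teams and keeps the Pareto-optimal (non-dominated) ones, rather than running the NSGA-II evolutionary search the paper actually uses:

```python
from itertools import combinations

# Hypothetical worker pool (invented for illustration): each worker maps to
# (requirement terms they cover, hiring cost, bug-detection experience score).
workers = {
    "w1": ({"login", "payment"}, 30, 0.8),
    "w2": ({"payment", "search"}, 20, 0.5),
    "w3": ({"login"}, 10, 0.9),
    "w4": ({"search", "profile"}, 25, 0.4),
}
requirements = {"login", "payment", "search", "profile"}

def objectives(team):
    # Three objectives from the abstract: maximize requirement coverage,
    # minimize total cost, maximize accumulated bug-detection experience.
    covered = set().union(*(workers[w][0] for w in team))
    coverage = len(covered & requirements) / len(requirements)
    cost = sum(workers[w][1] for w in team)
    experience = sum(workers[w][2] for w in team)
    return coverage, cost, experience

def dominates(a, b):
    # Team a dominates team b if it is no worse in every objective
    # and strictly better in at least one.
    no_worse = a[0] >= b[0] and a[1] <= b[1] and a[2] >= b[2]
    better = a[0] > b[0] or a[1] < b[1] or a[2] > b[2]
    return no_worse and better

# Enumerate all teams of up to three workers and keep the Pareto front.
teams = [t for r in (1, 2, 3) for t in combinations(workers, r)]
scored = {t: objectives(t) for t in teams}
pareto = [t for t in teams
          if not any(dominates(scored[u], scored[t]) for u in teams if u != t)]
```

NSGA-II scales this same dominance comparison to large worker pools by evolving a population of candidate teams with non-dominated sorting and crowding-distance selection instead of brute-force enumeration; the final output is likewise a Pareto front of trade-off teams rather than a single "best" team.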


Similar Articles

It's getting crowded!: improving the effectiveness of microtask crowdsourcing

Microtask crowdsourcing has emerged as an excellent means to acquire human input on demand, and has found widespread application in solving a variety of problems. Popular examples include surveys, content creation, acquisition of image annotations, etc. However, there are a number of challenges that need to be overcome to realize the true potential of this paradigm. With an aim to improve the ef...


Crowd-Selection Query Processing in Crowdsourcing Databases: A Task-Driven Approach

Crowd-selection is essential to crowdsourcing applications, since choosing the right workers with particular expertise to carry out specific crowdsourced tasks is extremely important. The central problem is simple but tricky: given a crowdsourced task, who is the right worker to ask? Currently, most existing work has mainly studied the problem of crowd-selection for simple crowdsourced tasks su...


Improving Quality of Crowdsourced Labels via Probabilistic Matrix Factorization

Quality assurance in crowdsourced annotation often involves having a given example labeled multiple times by different workers, then aggregating these labels. Unfortunately, the worker-example label matrix is typically sparse and imbalanced for two reasons: 1) the average crowd worker judges few examples; and 2) few labels are typically collected per example to reduce cost. To address this miss...


Crowdsourced Testing for Enterprises: Experiences

Crowdsourced testing is an emerging phenomenon in software testing which utilizes the benefits of crowdsourcing for software testing. It brings cost and quality advantages with faster delivery of software products. Among all software development activities, testing is one of the primary activities considered for crowdsourcing. Certain types of testing demand testers from diverse ethnicity, c...


Reducing Error in Context-Sensitive Crowdsourced Tasks

The recent growth of global crowdsourcing platforms has enabled businesses to leverage the time and expertise of workers world-wide with low overhead and at low cost. In order to utilize such platforms, one must decompose work into tasks that can be distributed to crowd workers. To this end, platform vendors provide task interfaces at varying degrees of granularity, from short, simple microtask...



Publication date: 2017